Core Elements: A Selection Guide for Coordinate Measuring Machine Styli
When choosing the most suitable stylus for your needs, there are several important factors to consider.
Factors to Consider in Stylus Selection
When deciding how accurate the results of a coordinate measuring machine need to be, the usual practice is to keep the ratio of CMM uncertainty to feature tolerance at no worse than 1:5 (1:10 is the ideal ratio, but is often too expensive and impractical). This ratio provides a safety margin, ensuring that the uncertainty of the measurement result is small relative to the expected range of workpiece error. As long as a 1:5 ratio is maintained even at the tightest tolerance levels, arguments about accuracy largely disappear.
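The 1:5 rule is a simple division; as a quick illustration (the tolerance and uncertainty figures below are hypothetical examples, not values from the text):

```python
# Illustrative check of the 1:5 uncertainty-to-tolerance rule.
# The feature tolerance and CMM uncertainty are hypothetical examples.

def max_allowed_uncertainty(tolerance_um: float, ratio: int = 5) -> float:
    """Largest measurement uncertainty that keeps the given 1:ratio rule."""
    return tolerance_um / ratio

tolerance = 50.0          # feature tolerance band, in micrometres
cmm_uncertainty = 8.0     # stated CMM measurement uncertainty, in micrometres

limit = max_allowed_uncertainty(tolerance)   # 10.0 um at 1:5
print(f"1:5 limit: {limit} um -> ok: {cmm_uncertainty <= limit}")
print(f"1:10 limit: {max_allowed_uncertainty(tolerance, 10)} um -> ok: "
      f"{cmm_uncertainty <= max_allowed_uncertainty(tolerance, 10)}")
```

At 1:5 the example uncertainty of 8 μm is acceptable for a 50 μm tolerance; at the ideal 1:10 it is not, which is why the stricter ratio is often impractical.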
Unfortunately, seemingly minor operations such as changing the stylus on the probe can significantly affect the accuracy that is actually achievable, and hence the measurement results. Relying on the CMM's annual calibration check is not sufficient here, because that check only confirms performance with the (usually very short) stylus used for the test. It may therefore represent a best-case accuracy. To understand more fully the accuracy achievable across different measurements, we need to assess how the stylus contributes to measurement uncertainty.
This section examines the four main aspects of stylus selection that affect the overall accuracy of a coordinate measuring machine:
1. Sphericity (roundness) of the stylus ball
2. Stylus bending
3. Thermal stability
4. Tip material selection (scanning applications)
Sphericity (roundness) of the stylus ball
Most styli are tipped with a ball, most commonly made of synthetic ruby. Any sphericity (roundness) error in this ball can degrade the accuracy of the coordinate measuring machine by up to 10%.
Ruby stylus balls are manufactured to defined precision levels known as "grades", which specify the maximum deviation of the ball from a perfect sphere. The two most commonly used grades are Grade 5 and Grade 10 (the lower the grade number, the better the ball). Downgrading from a Grade 5 to a Grade 10 ball may save a little on the cost of the stylus, but it can easily undermine the 1:5 ratio described above.
The problem is that ball grade cannot be judged with the naked eye, and its contribution to the measurement result is not obvious, making it hard to estimate whether it matters. One approach is to specify Grade 5 balls as standard: they cost slightly more, but compared with the risk of scrapping good parts, or passing bad ones, because of the stylus ball, that cost is negligible. Counterintuitively, the more accurate the coordinate measuring machine, the greater the influence of ball grade. On the highest-specification CMMs, this effect can degrade accuracy by 10%.
Consider the following example.
A typical probing error per ISO 10360-2 (MPEp), measured with a Grade 5 stylus ball:
· MPEp = 1.70 μm
This figure is derived from measuring 25 discrete points, each evaluated as an individual radius; the range of radius variation gives the MPEp value. Stylus ball roundness feeds directly into this. In this example, replacing the Grade 5 ball with a Grade 10 ball increased the value by 0.12 μm, a 7% increase in probing error:
· MPEp = 1.82 μm
Please note: stylus ball roundness also affects MPETHP, which evaluates scanning probe performance using four scan paths over a sphere.
Note:
· Grade 5 ball sphericity = 0.13 μm
· Grade 10 ball sphericity = 0.25 μm
For the most demanding applications, Renishaw offers a range of styli fitted with Grade 3 balls, with a sphericity of just 0.08 μm.
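The worked example above can be checked arithmetically; the figures below are taken directly from the text:

```python
# Reproducing the worked example: effect of ball grade on MPEp.
# All figures are the values quoted in the text above.

mpep_grade5 = 1.70          # um, MPEp measured with a Grade 5 ball
mpep_grade10 = 1.82         # um, same test with a Grade 10 ball

sphericity_grade5 = 0.13    # um
sphericity_grade10 = 0.25   # um

increase = mpep_grade10 - mpep_grade5     # 0.12 um
percent = increase / mpep_grade5 * 100    # about 7 %

print(f"MPEp increase: {increase:.2f} um ({percent:.0f} %)")
print(f"Sphericity difference: {sphericity_grade10 - sphericity_grade5:.2f} um")
```

The 0.12 μm rise in MPEp matches the 0.12 μm difference in sphericity allowance between the two grades, which is why ball grade feeds so directly into probing error.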
Stylus bending
When using touch-trigger probes (such as the industry-standard TP20), it is common to exchange stylus modules so that each measurement task uses an optimized stylus. The reason long styli are not used for every feature measurement is that the longer the stylus, the greater the loss of accuracy. A good rule is to choose the shortest, stiffest stylus possible. But why?
Although the stylus is not the direct cause of this particular error, the error does grow with stylus length. It arises because the probe's trigger force differs with direction. Most probes do not trigger at the instant the stylus touches the workpiece; the force must keep building until it overcomes the spring preload in the sensor mechanism, and that force bends the stylus. This bending allows the probe to keep moving a short distance after physical contact occurs and before the trigger is generated. This movement is known as pre-travel.
The three-point kinematic seating used in most probes requires different forces, depending on direction, to generate a trigger. In stiffer directions the probe resists triggering until the stylus has bent further, which means the CMM travels further, so pre-travel varies with the approach angle (see figure on the right). With compound approach angles (X, Y, and Z combined), this pre-travel variation becomes more complex still.
· Pre-travel variation of a touch-trigger probe
To reduce this effect, every stylus is qualified (calibrated) against a reference sphere of known size before use. Ideally, this process would correct the combined errors of stylus and approach angle. In practice, to save time, only a sample of angles is checked and the results averaged, so a small residual error can remain.
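The interaction between direction-dependent trigger force, stylus bending, and averaged calibration can be illustrated with a deliberately simplified toy model. The three-lobed force profile, dimensions, and force values below are assumptions for illustration only, not Renishaw probe data:

```python
import math

# Toy model of pre-travel variation (illustrative assumptions only):
# the three-point kinematic seat makes the trigger force vary with
# approach angle in a three-lobed pattern; the stylus must bend by
# delta = F * L^3 / (3 * E * I) before the probe triggers.

E = 210e3                      # N/mm^2, steel stem (Young's modulus)
d = 2.0                        # mm, stem diameter (assumed)
L = 20.0                       # mm, stylus length (assumed)
I = math.pi * d**4 / 64        # second moment of area, mm^4

def trigger_force(angle_deg: float) -> float:
    """Assumed trigger force in newtons, with three-fold symmetry."""
    return 0.08 + 0.02 * abs(math.cos(math.radians(1.5 * angle_deg)))

angles = range(0, 360, 30)
pretravel_mm = {a: trigger_force(a) * L**3 / (3 * E * I) for a in angles}

# Qualification against a reference sphere removes the average pre-travel;
# the angle-dependent remainder is the residual error discussed above.
mean = sum(pretravel_mm.values()) / len(pretravel_mm)
residual_um = {a: (p - mean) * 1000 for a, p in pretravel_mm.items()}

for a in angles:
    print(f"{a:3d} deg: residual {residual_um[a]:+.3f} um")
```

The point of the sketch is the last step: averaging during qualification removes the mean pre-travel, but the direction-dependent remainder survives, and it scales with L³, which is why short, stiff styli are preferred.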
Without empirical testing, it is difficult to estimate how much these errors contribute to measurement uncertainty. The key point is that any residual pre-travel variation is affected by the choice of stylus. This underlines the importance of material selection in stylus design: the stem's stiffness, resistance to twisting, weight, and cost all matter. Steel, with a Young's modulus of E = 210 kN/mm², is suitable for many shorter styli. The stiffest material in common use is tungsten carbide (E = 620 kN/mm²), but its high density means it is rarely used for long styli. Carbon fiber combines high stiffness (E = 450 kN/mm²) with low weight, while ceramic stems (E = 300-400 kN/mm²) are commonly used in machine tool probing applications thanks to their light weight and high thermal stability.
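The Young's modulus figures above translate directly into tip deflection under probing load. A sketch using the standard cantilever-beam formula, with an assumed force, length, and stem diameter:

```python
import math

# Cantilever estimate of stylus tip deflection, delta = F * L^3 / (3 * E * I).
# The force, length and stem diameter are assumed illustration values;
# the Young's moduli are the figures quoted in the text.

F = 0.1                        # N, probing force (assumed)
L = 100.0                      # mm, stylus length (assumed)
d = 3.0                        # mm, stem diameter (assumed)
I = math.pi * d**4 / 64        # second moment of area, mm^4

moduli_n_mm2 = {               # kN/mm^2 from the text, converted to N/mm^2
    "steel": 210e3,
    "carbon fiber": 450e3,
    "tungsten carbide": 620e3,
}

deflection_um = {name: F * L**3 / (3 * E * I) * 1000
                 for name, E in moduli_n_mm2.items()}

for name, delta in deflection_um.items():
    print(f"{name:16s}: {delta:5.1f} um")
```

Because deflection is inversely proportional to E, a tungsten carbide stem bends roughly a third as much as a steel one of the same geometry; the trade-off, as noted above, is its density.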
Stylus stiffness is also affected by the joints within the stylus assembly. As a guiding principle, avoid adaptors wherever possible, as they introduce hysteresis. They cannot always be avoided, however, when measuring complex workpieces with a fixed probe; such cases may require a configuration built from styli, extensions, adaptors, and joints. Here again, careful choice of stylus material is essential, since it affects the stiffness, weight, and robustness of the assembly.
Thermal stability
Temperature changes can cause serious measurement errors. Choosing the right material for stylus extensions ensures better stability under temperature changes and more reliable measurement results. Materials with a low coefficient of thermal expansion are preferable, especially for long styli, since the amount of thermal growth scales with stylus length.
As noted above, carbon fiber is the most common material for long styli and extensions because it is stiff, light, and its length barely changes with temperature. Where metal parts are needed, such as adaptors and joints, titanium offers an excellent combination of strength, stability, and density. Renishaw offers styli and stylus extensions made from both materials.
· Coefficient of thermal expansion
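The growth itself is a one-line formula, ΔL = α · L · ΔT. A sketch with typical handbook expansion coefficients (the α values, length, and temperature change below are assumptions for illustration, not figures from the text):

```python
# Thermal growth of a stylus extension: delta_L = alpha * L * delta_T.
# The expansion coefficients are typical handbook values (assumptions,
# not taken from the text); length and temperature change are examples.

alpha_per_K = {
    "steel": 11.5e-6,
    "titanium": 8.6e-6,
    "carbon fiber": 0.5e-6,   # near zero along the fiber direction
}

L_mm = 200.0     # extension length (assumed)
dT_K = 2.0       # temperature change (assumed)

growth_um = {name: alpha * L_mm * dT_K * 1000
             for name, alpha in alpha_per_K.items()}

for name, g in growth_um.items():
    print(f"{name:13s}: {g:4.2f} um over {dT_K} K")
```

Even a 2 K drift moves a 200 mm steel extension by several micrometres, comparable to the MPEp figures discussed earlier, whereas carbon fiber stays well under a micrometre.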
Tip material selection (scanning applications)
For most applications, a ruby ball is the default choice of stylus tip, but in some cases other materials are a better option.
In touch-trigger measurement, the tip contacts the workpiece surface only briefly and without relative movement. Scanning is different: the ball slides along the workpiece surface, producing friction and wear. Under adverse conditions this continuous contact can cause workpiece material to be removed, or to adhere to the ball, degrading its sphericity. These effects are amplified if the same part of the ball remains in contact with the workpiece. Renishaw has researched these effects extensively, identifying two types of wear:
Abrasive wear
Abrasive wear arises when scanning surfaces such as cast iron: tiny residual particles scratch both the ball and the workpiece surface, leaving small "shallow pits" on the tip. Hard zirconia tips are the best choice for these applications.
Adhesive wear
Adhesive wear occurs when there is a chemical affinity between the ball and the workpiece. It can arise when scanning aluminum workpieces with ruby (aluminum oxide) balls: material transfers from the softer workpiece to the stylus, building up an aluminum coating on the tip that degrades its roundness. The best choice here is silicon nitride, which is wear-resistant and has no affinity with aluminum.
Other factors
Other factors to consider when choosing a stylus include:
· Stylus thread size, to match the chosen probe
· Stylus type: straight, star, swivel, or custom configurations
· Tip type: ball, cylinder, disc, or hemispherical
· Tip size, to minimize the effect of surface roughness on measurement accuracy
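The last point is simple geometry: a larger ball bridges surface valleys rather than dipping into them. A sketch using the sagitta of a ball resting across a valley (the ball diameters and valley width are assumed illustration values):

```python
import math

# Why a larger tip filters surface roughness: a ball resting across a
# surface valley of width w sinks in by the sagitta
#     s = R - sqrt(R^2 - (w/2)^2)  ~=  w^2 / (8 R).
# Ball sizes and valley width are assumed illustration values.

def penetration_um(ball_dia_mm: float, valley_width_mm: float) -> float:
    """Depth (in micrometres) a ball sinks into a valley of given width."""
    R = ball_dia_mm / 2
    s_mm = R - math.sqrt(R**2 - (valley_width_mm / 2) ** 2)
    return s_mm * 1000

for dia in (1.0, 3.0, 6.0):
    print(f"{dia} mm ball over a 0.1 mm valley: "
          f"{penetration_um(dia, 0.1):.2f} um")
```

Doubling the ball diameter roughly halves the penetration, so a larger tip acts as a mechanical filter over roughness, at the cost of access to small features.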